Reinforcement Learning with Kernel Recursive Least-Squares Support Vector Machine
Authors
Abstract
Similar resources
Least Squares Support Vector Machine for Constitutive Modeling of Clay
Constitutive modeling of clay is an important research topic in geotechnical engineering. It is difficult to use precise mathematical expressions to approximate the stress-strain relationship of clay. Artificial neural networks (ANN) and support vector machines (SVM) have been successfully used in constitutive modeling of clay. However, the generalization ability of ANN has some limitations, and application of...
Kernel Recursive Least Squares
We present a non-linear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared-error regressor. Sparsity (and therefore regularization) of the solution is achieved by an explicit greedy sparsification proces...
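The recursion summarized in this abstract can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the authors' reference implementation: the class name `KRLS`, the Gaussian kernel, and the threshold `nu` for the greedy approximate-linear-dependence (ALD) sparsification test are all choices made here for the sketch.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Mercer kernel inducing the feature space (assumed Gaussian here)
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

class KRLS:
    """Sketch of kernel RLS with greedy ALD sparsification."""

    def __init__(self, kernel, nu=1e-3):
        self.kernel = kernel
        self.nu = nu       # ALD threshold controlling dictionary growth
        self.dict = []     # retained (dictionary) samples
        self.alpha = None  # kernel expansion coefficients
        self.Kinv = None   # inverse kernel matrix over the dictionary
        self.P = None      # projection covariance for the reduced problem

    def _kvec(self, x):
        return np.array([self.kernel(d, x) for d in self.dict])

    def predict(self, x):
        if not self.dict:
            return 0.0
        return float(self._kvec(x) @ self.alpha)

    def update(self, x, y):
        ktt = self.kernel(x, x)
        if not self.dict:
            self.dict = [x]
            self.Kinv = np.array([[1.0 / ktt]])
            self.P = np.array([[1.0]])
            self.alpha = np.array([y / ktt])
            return
        k = self._kvec(x)
        a = self.Kinv @ k          # best representation in the dictionary span
        delta = ktt - k @ a        # ALD residual: distance to that span
        err = y - k @ self.alpha   # prediction error on the new sample
        if delta > self.nu:
            # Sample is (approximately) linearly independent: grow dictionary
            n = len(self.dict)
            Kinv_new = np.empty((n + 1, n + 1))
            Kinv_new[:n, :n] = self.Kinv + np.outer(a, a) / delta
            Kinv_new[:n, n] = -a / delta
            Kinv_new[n, :n] = -a / delta
            Kinv_new[n, n] = 1.0 / delta
            self.Kinv = Kinv_new
            P_new = np.zeros((n + 1, n + 1))
            P_new[:n, :n] = self.P
            P_new[n, n] = 1.0
            self.P = P_new
            self.alpha = np.append(self.alpha - a * (err / delta), err / delta)
            self.dict.append(x)
        else:
            # Representable in the current dictionary: update coefficients only
            Pa = self.P @ a
            q = Pa / (1.0 + a @ Pa)
            self.P = self.P - np.outer(q, Pa)
            self.alpha = self.alpha + self.Kinv @ q * err
```

Because samples that are nearly linearly dependent on the dictionary never enlarge it, the model stays sparse even when trained on a dense stream of inputs.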
متن کاملSparse least squares Support Vector Machine classifiers
In least squares support vector machine (LS-SVM) classifiers the original SVM formulation of Vapnik is modified by considering equality constraints within a form of ridge regression instead of inequality constraints. As a result the solution follows from solving a set of linear equations instead of a quadratic programming problem. However, a drawback is that sparseness is lost in the LS-SVM ...
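The linear-system formulation mentioned in this abstract can be made concrete with a short sketch. The function names and the RBF kernel below are illustrative assumptions; the system solved is the standard LS-SVM KKT system [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1] with Omega_ij = y_i y_j K(x_i, x_j), which replaces the QP of the classical SVM with a single linear solve.

```python
import numpy as np

def rbf(X, Z, sigma=1.0):
    # Pairwise Gaussian kernel matrix between rows of X and rows of Z
    d = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Fit an LS-SVM classifier by solving its KKT linear system (no QP)."""
    n = len(y)
    Omega = np.outer(y, y) * rbf(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma  # gamma is the ridge parameter
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # (alpha, b)

def lssvm_predict(X, y, alpha, b, Xnew, sigma=1.0):
    # Decision function: sign(sum_i alpha_i y_i K(x_i, x) + b)
    return np.sign(rbf(Xnew, X, sigma) @ (alpha * y) + b)
```

Note the drawback the abstract raises: the equality constraints make essentially every alpha_i nonzero, so unlike the classical SVM the LS-SVM solution is not sparse.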
Hierarchic Kernel Recursive Least-Squares
We present a new hierarchic kernel based modeling technique for modeling evenly distributed multidimensional datasets that does not rely on input space sparsification. The presented method reorganizes the typical single-layer kernel based model in a hierarchical structure, such that the weights of a kernel model over each dimension are modeled over the adjacent dimension. We show that the impos...
Journal
Journal title: International Journal of Machine Learning and Computing
Year: 2012
ISSN: 2010-3700
DOI: 10.7763/ijmlc.2012.v2.201